Quantifying Generalization in Linearly Weighted Neural Networks

Authors

  • Martin Anthony
  • Sean B. Holden
Abstract

The Vapnik-Chervonenkis dimension has proven to be of great use in the theoretical study of generalization in artificial neural networks. The "probably approximately correct" learning framework is described and the importance of the Vapnik-Chervonenkis dimension is illustrated. We then investigate the Vapnik-Chervonenkis dimension of certain types of linearly weighted neural networks. First, we obtain bounds on the Vapnik-Chervonenkis dimensions of radial basis function networks with basis functions of several types. Secondly, we calculate the Vapnik-Chervonenkis dimension of polynomial discriminant functions defined over both real and binary-valued inputs.
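
To illustrate how a VC-dimension bound translates into a concrete sample-size guarantee, the sketch below (not taken from the paper itself) combines two standard ingredients: a linearly weighted network built from k fixed Gaussian radial basis functions, whose thresholded outputs form a class of half-spaces in the (k+1)-dimensional weight space and so have VC dimension at most k + 1, and the classical consistent-learner bound of Blumer et al. (1989). The centres, width, and data are hypothetical.

```python
import numpy as np

def pac_sample_bound(vc_dim, epsilon, delta):
    """Sample size sufficient for PAC learning with a consistent learner.

    Classical bound of Blumer et al. (1989):
        m >= max( (4/eps) * log2(2/delta), (8*d/eps) * log2(13/eps) )
    Any hypothesis consistent with that many examples is then, with
    probability at least 1 - delta, within error epsilon of the target.
    """
    a = (4.0 / epsilon) * np.log2(2.0 / delta)
    b = (8.0 * vc_dim / epsilon) * np.log2(13.0 / epsilon)
    return int(np.ceil(max(a, b)))

def rbf_features(X, centres, width):
    """Gaussian basis functions phi_i(x) = exp(-||x - c_i||^2 / (2 width^2))."""
    sq_dists = ((X[:, None, :] - centres[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-sq_dists / (2.0 * width ** 2))

rng = np.random.default_rng(0)
k, width = 10, 1.0
centres = rng.normal(size=(k, 2))        # hypothetical fixed centres

# A linearly weighted network: f(x) = sign(w . phi(x) + b).  With the k basis
# functions fixed, only the linear weights vary, so the thresholded class has
# VC dimension at most k + 1.
X = rng.normal(size=(5, 2))              # hypothetical inputs
phi = rbf_features(X, centres, width)    # shape (5, k)
w, b = np.ones(k), 0.0                   # arbitrary weights for illustration
print(np.sign(phi @ w + b))

print(pac_sample_bound(vc_dim=k + 1, epsilon=0.1, delta=0.05))
```

Tighter, architecture-specific bounds of the kind derived in the paper shrink vc_dim and hence the required sample size; the sketch only shows the direction of the dependence.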

Similar Articles

Monitoring of Regional Low-Flow Frequency Using Artificial Neural Networks

Ecosystems of the world's arid and semiarid regions are sensitive and fragile environments whose components are easily destroyed. In this paper, artificial neural networks (ANNs) are introduced to obtain improved regional low-flow estimates at ungauged sites. A multilayer perceptron (MLP) network is used to identify the funct...

SGD Learns Over-parameterized Networks that Provably Generalize on Linearly Separable Data

Neural networks exhibit good generalization behavior in the over-parameterized regime, where the number of network parameters exceeds the number of observations. Nonetheless, current generalization bounds for neural networks fail to explain this phenomenon. In an attempt to bridge this gap, we study the problem of learning a two-layer over-parameterized neural network, when the data is generate...
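
As a purely illustrative instance of this setting, the sketch below trains a two-layer network whose hidden width far exceeds the number of examples by SGD on the hinge loss, with data that are linearly separable by construction. The leaky-ReLU hidden layer and the fixed ±1 second-layer weights follow common analyses of this regime and are assumptions here, not details taken from the cited paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Linearly separable data: labels come from a hidden ground-truth direction.
n, d = 50, 10
w_star = rng.normal(size=d)
X = rng.normal(size=(n, d))
y = np.sign(X @ w_star)

# Over-parameterized two-layer net: hidden width h >> n.  Only the first
# layer is trained; the second-layer signs are fixed at +/-1 (a common
# analysis convention, assumed here).
h, alpha, lr = 500, 0.1, 0.01
W = rng.normal(scale=0.1, size=(h, d))
v = rng.choice([-1.0, 1.0], size=h)

def forward(W, x):
    pre = W @ x
    return v @ np.where(pre > 0, pre, alpha * pre)   # leaky-ReLU hidden layer

# SGD on the hinge loss: a pattern triggers an update only while its margin
# constraint y * f(x) >= 1 is violated.
for epoch in range(100):
    for i in rng.permutation(n):
        if y[i] * forward(W, X[i]) < 1.0:
            slope = np.where(W @ X[i] > 0, 1.0, alpha)     # sigma'(pre)
            W += lr * y[i] * np.outer(slope * v, X[i])     # -grad of hinge

acc = np.mean([np.sign(forward(W, x)) == t for x, t in zip(X, y)])
print(f"training accuracy: {acc:.2f}")
```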

Generalization ability of optimal cluster separation networks

Optimal separation of two clusters of normalized vectors can be performed in a neural network with adjustable threshold and weights, which is trained to maximum stability. Generalization from arbitrarily selected training clusters to a given bipartitioning of input space is studied. The network's threshold becomes a global optimization (and order) parameter. This causes the generalization abili...
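
"Maximum stability" here means the weight vector that maximizes the smallest margin over the training patterns. One standard way to train toward it is the MinOver rule of Krauth and Mézard (1987), sketched below with hypothetical cluster data; the adjustable threshold is folded into the weights through a constant input component.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two clusters of normalized input vectors with labels +/-1 (hypothetical data).
n, d = 40, 5
X = np.vstack([rng.normal(loc=+2.0, size=(n, d)),
               rng.normal(loc=-2.0, size=(n, d))])
X /= np.linalg.norm(X, axis=1, keepdims=True)       # normalized vectors
y = np.concatenate([np.ones(n), -np.ones(n)])

# Fold the adjustable threshold into the weights via a constant input.
Xb = np.hstack([X, np.ones((2 * n, 1))])

# MinOver (Krauth & Mezard, 1987): repeatedly reinforce the currently least
# stable pattern; for separable data the direction of w converges to the
# maximal-stability (i.e., maximal-margin) separator.
w = np.zeros(d + 1)
for _ in range(20000):
    i = np.argmin(y * (Xb @ w))      # pattern with the smallest margin
    w += y[i] * Xb[i]                # Hebbian reinforcement

stability = np.min(y * (Xb @ w)) / np.linalg.norm(w)
print(f"stability (normalized minimal margin): {stability:.3f}")
```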

Generalization and capacity of extensively large two-layered perceptrons.

The generalization ability and storage capacity of a tree-like two-layered neural network with a number of hidden units scaling as the input dimension are examined. The mapping from the input to the hidden layer is via Boolean functions; the mapping from the hidden layer to the output is done by a perceptron. The analysis is within the replica framework where an order parameter characterizing the...
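
To make the architecture concrete, here is a minimal sketch of such a tree-like machine: the binary inputs are split into disjoint receptive fields, each hidden unit applies a Boolean function to its own field (parity, an arbitrary illustrative choice), and the hidden-to-output map is a perceptron trained by the usual perceptron rule. All sizes and the teacher construction are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tree-like two-layer machine: K hidden units with disjoint receptive fields,
# so no input bit feeds more than one branch.
N, K = 30, 5                     # input bits, hidden units
block = N // K                   # bits per receptive field

def boolean_hidden(x):
    """Each hidden unit computes a Boolean function of its own block
    (parity here, an illustrative choice); outputs are +/-1."""
    return 1 - 2 * (x.reshape(K, block).sum(axis=1) % 2)

# Realizable task: labels produced by a teacher perceptron on the hidden layer.
P = 40
X = rng.integers(0, 2, size=(P, N))
H = np.array([boolean_hidden(x) for x in X])
y = np.sign(H @ rng.normal(size=K))

# Train only the hidden-to-output perceptron; the Boolean layer stays fixed.
w, theta = np.zeros(K), 0.0
for _ in range(1000):
    mistakes = 0
    for h_vec, t in zip(H, y):
        if t * (w @ h_vec - theta) <= 0:   # perceptron learning rule
            w += t * h_vec
            theta -= t
            mistakes += 1
    if mistakes == 0:
        break

print("converged" if mistakes == 0 else "not yet converged")
```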

Journal:
  • Complex Systems

Volume 8, Issue -

Pages -

Publication date 1994